Implementation of a high-quality Dolby Digital decoder using MMX™ technology

Authors

  • James C. Abel
  • Michael A. Julier
Abstract

Dolby* Digital is a high-quality audio compression format widely used in feature films and, more recently, on DVD¹. PCs now offer DVD drives, and providing a Dolby Digital decoder in software allows decoding of Dolby Digital to become a baseline capability on the PC. Intel's MMX™ technology provides instructions that can significantly speed up the execution of the Dolby Digital decoder, freeing the processor to perform other tasks such as video decoding and/or audio enhancement. A simple port of Dolby Digital to MMX technology using only a 16-bit data type introduces quantization noise that makes the decoder unsatisfactory for high-quality audio. However, MMX technology provides additional flexibility through 32-bit operations which, combined with other software techniques, allow the implementer to increase the audio quality of the decoder while still providing a significant speedup over implementations that do not use MMX technology. Intel has worked closely with Dolby Laboratories to define an implementation of Dolby Digital based on MMX technology that has achieved Dolby's certification of quality. This paper describes the research performed and the resultant techniques Intel used in creating its Dolby Digital decoder.

¹ DVD is often referred to as Digital Versatile Disk or Digital Video Disk.

Introduction

Dolby* Digital is a transform-based audio coding algorithm designed to provide data-rate reduction for wide-band signals while maintaining the high quality of the original content [1]. MMX™ technology can be used to provide a processor-efficient implementation of Dolby Digital for a PC based on a Pentium® processor with MMX technology. It is important to maintain high audio quality, and Dolby Laboratories has developed a stringent test suite to ensure that a certified decoder indeed provides high quality. In addition, trained listeners evaluate prospective decoders using both test and program material. Only after a decoder has passed both the analytical and the subjective tests is it certified.

Intel's MMX instructions operate on 8, 16, and 32 bits. The human ear has an overall dynamic range of 120 dB and an instantaneous dynamic range of 85 dB [2]. The dynamic range of a binary value is 6.0206 dB per bit. Eight bits (48 dB of dynamic range, about that of AM radio) is obviously insufficient for high-quality audio. Sixteen bits (96 dB of dynamic range, as used on Compact Discs) is usually considered high-quality audio, and we accept this notion for this paper. However, due to rounding errors during the intermediate calculations, the accuracy at the output of a Dolby Digital decoder is significantly less than the accuracy of the intermediate values (assuming uniform accuracy throughout the algorithm). This is typical of signal-processing algorithms. Using 16 bits of accuracy uniformly throughout a Dolby Digital decoder is insufficient to pass the test suite.

The challenge was to obtain both good execution speed and good audio quality. 32-bit floating-point numbers could be used throughout the data path, with MMX technology used only for bit manipulation, but this would not be the most processor-efficient method. MMX technology provides integer operations that are more processor-efficient than the existing floating-point operations, so we strove to use the MMX instructions as much as possible. The goal of this investigation was to find a minimal-CPU implementation at an acceptable audio quality level.
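To make the rounding-error argument concrete, the following sketch (plain C rather than MMX intrinsics, with a synthetic signal and made-up coefficients, not Intel's decoder code) runs the same Q15 dot product, of the kind found in a windowing or transform stage, two ways: once keeping the full 16x16-to-32-bit products, as the MMX PMADDWD instruction does, and once truncating every product back to 16 bits before summing.

    /* Sketch: per-term 16-bit truncation vs. a wide accumulator.
     * Signal, coefficients, and stage length are illustrative only. */
    #include <stdio.h>
    #include <stdint.h>
    #include <math.h>

    #define N 256   /* length of a hypothetical filter/transform stage */

    int main(void)
    {
        double  ref   = 0.0;  /* double-precision reference result         */
        int64_t acc32 = 0;    /* full-width products, rounded once at end  */
        int32_t acc16 = 0;    /* each product truncated to Q15 before add  */

        for (int i = 0; i < N; i++) {
            int16_t x = (int16_t)(32767.0 * sin(0.10 * i));         /* Q15 sample      */
            int16_t c = (int16_t)(32767.0 * cos(0.07 * i) / 16.0);  /* Q15 coefficient */

            ref   += (x / 32768.0) * (c / 32768.0);
            acc32 += (int32_t)x * c;                      /* keep all product bits  */
            acc16 += (int16_t)(((int32_t)x * c) >> 15);   /* drop 15 bits per term  */
        }

        double y32 = (double)acc32 / (32768.0 * 32768.0);
        double y16 = (double)acc16 / 32768.0;

        printf("reference                : % .9f\n", ref);
        printf("32-bit accumulation      : % .9f  (error %.2e)\n", y32, fabs(y32 - ref));
        printf("16-bit truncation per term: % .9f  (error %.2e)\n", y16, fabs(y16 - ref));

        /* An n-bit word spans 20*log10(2^n) = 6.0206*n dB of dynamic range:
         * about 48 dB for 8 bits and 96 dB for 16 bits. */
        printf("16-bit dynamic range     : %.1f dB\n", 20.0 * log10(65536.0));
        return 0;
    }

In this toy example the wide-accumulator result matches the reference to within double-precision rounding, while the per-term-truncated result is off by the accumulated truncation bias; this is the quantization noise that makes a uniformly 16-bit decoder unsatisfactory.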
If the CPU requirements could be made small enough, Dolby Digital decoding entirely in software would be feasible, even in combination with other operations (such as video playback). In order to do this, we had to determine the accuracy required in the various stages of the Dolby Digital decoder while maintaining effective use of MMX technology. We found that by using the flexibility of the 16-bit and 32-bit data types in the MMX instruction set, we were able to increase the accuracy of the Dolby Digital decoder significantly beyond that of a simple 16-bit approach with only a small impact on CPU performance. We also found that MMX technology can be used to speed up the bit-manipulation, dithering, and downmix sections of the decoder (a downmix sketch is given below).

An additional benefit of performing the audio decode in software is the resultant flexibility of the audio subsystem. If the Dolby Digital decoder is in software, it is easier to route the decoded audio to other audio subsystems. For example, simultaneous mixing of the PC's system sounds (i.e., via the Microsoft Windows Wave Device API) with the decoded audio is possible.

Dolby Digital Decoder

A block diagram of the Dolby Digital decoder is shown
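As an illustration of the downmix stage referred to above, here is a minimal sketch, in plain C rather than MMX code, of a five-channel-to-stereo downmix written in the 16-bit/32-bit style that maps onto MMX packed multiply-add and saturating-add instructions. The 0.7071 (-3 dB) coefficients, the channel order, the omission of the LFE channel, and the function name downmix_to_stereo are assumptions for illustration only, not Dolby's specified tables or Intel's routine.

    #include <stdint.h>

    #define Q15(x) ((int16_t)((x) * 32768.0 + 0.5))

    /* saturate to 16 bits, as MMX packed saturating adds/packs do */
    static int16_t sat16(int64_t v)
    {
        if (v >  32767) return  32767;
        if (v < -32768) return -32768;
        return (int16_t)v;
    }

    /* in[0..4] point to the decoded L, R, C, Ls, Rs channels; n samples each. */
    void downmix_to_stereo(const int16_t *in[5],
                           int16_t *outL, int16_t *outR, int n)
    {
        const int16_t k = Q15(0.7071);   /* assumed -3 dB mix coefficient */

        for (int i = 0; i < n; i++) {
            /* keep the 16x16 -> 32-bit products at full width (PMADDWD style)
             * and round back to 16 bits only once, at the output */
            int64_t l = ((int64_t)in[0][i] << 15)
                      + (int64_t)k * in[2][i]      /* center         */
                      + (int64_t)k * in[3][i];     /* left surround  */
            int64_t r = ((int64_t)in[1][i] << 15)
                      + (int64_t)k * in[2][i]
                      + (int64_t)k * in[4][i];     /* right surround */
            outL[i] = sat16(l >> 15);
            outR[i] = sat16(r >> 15);
        }
    }

Keeping the products at full width until a single final shift, and saturating only at the output, follows the paper's theme of storing data in 16 bits while performing the intermediate arithmetic at wider precision.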


Similar articles

A Novel Design of a Multi-layer 2:4 Decoder using Quantum-Dot Cellular Automata

The quantum-dot cellular automata (QCA) is considered as an alternative to complementary metal oxide semiconductor (CMOS) technology based on physical phenomena like Coulomb interaction to overcome the physical limitations of this technology. The decoder is one of the important components in digital circuits, which can be used in more comprehensive circuits such as full adde...

Minimal Word-Width Implementation of Dolby AC-3 and ISO MPEG Audio Decompression Standards

Digital Audio Broadcasting (DAB) is an emerging technology which is currently experiencing rapid growth following the advent of the ETS 300-401 DAB standard in Europe and the ATSC ATV standard in the U.S. Wide-band audio compression is performed by ISO-MPEG layer II for DAB, and by Dolby AC-3 for ATV. To achieve high quality audio, current DSP implementations for such compression algorithms typical...

Digital Signal Processing on MMX™ Technology

Algorithmic-level optimization and programming-level optimization are tightly coupled with each other. Many programmers can optimize the implementation of a specific algorithm using MMX™ technology. However, without algorithmic-level optimization, the speed-up of the optimization will be limited. On the other hand, many algorithm developers can optimize the DSP algorithm in terms of the number...

Software Optimization of Video Codecs on Pentium Processor with MMX Technology

A key enabling technology for the proliferation of multimedia PC's is the availability of fast video codecs, which are the basic building blocks of many new multimedia applications. Since most industrial video coding standards (e.g., MPEG1, MPEG2, H.261, H.263) only specify the decoder syntax, there is a lot of room for optimization in a practical implementation. When considering a specific h...

Software Optimization of H.263 Video Encoder on Pentium Processor with MMX Technology

A key enabling technology for the proliferation of multimedia PC's is the availability of fast video codecs, which are the basic building blocks of many new multimedia applications. Since most industrial video coding standards (e.g., MPEG1, MPEG2, H.261, H.263, etc.) only specify the decoder syntax, there is a lot of room for optimization in a practical implementation. When considering a speci...

